
    Magic Pointing for Eyewear Computers


    Pupil Center as a Function of Pupil Diameter


    TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies

    In this paper we present the TobiiGlassesPySuite, an open-source suite we implemented for using the Tobii Pro Glasses 2 wearable eye tracker in custom eye-tracking studies. We provide a platform-independent solution for controlling the device and for managing the recordings. The software consists of Python modules, integrated into a single package and accompanied by sample scripts and recordings. The proposed solution aims to provide methods beyond those of the manufacturer's software, allowing users to exploit the device's capabilities and the existing software more fully. Our suite is available for download from the repository indicated in the paper and is usable under the terms of the GNU GPL v3.0 license.
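
    The recordings such a suite manages can also be inspected directly. Below is a minimal sketch, assuming the Glasses 2's documented recording layout (a gzipped livedata.json file with one JSON object per line, where gaze-position packets carry a normalized 'gp' coordinate pair, a 'ts' timestamp, and a status flag 's'); the field names are assumptions about the recording format and this is not the suite's own API.

```python
import gzip
import json

def read_gaze_samples(path):
    """Yield (timestamp, x, y) gaze samples from a Glasses 2 livedata.json.gz file.

    Assumes the recording layout described above; field names are an
    assumption, not part of the TobiiGlassesPySuite API.
    """
    with gzip.open(path, "rt") as f:
        for line in f:
            packet = json.loads(line)
            # 'gp' packets hold the gaze point; s == 0 marks a valid sample
            if "gp" in packet and packet.get("s", 0) == 0:
                x, y = packet["gp"]
                yield packet["ts"], x, y

# Example: count the valid gaze samples in one recording segment
# n_samples = sum(1 for _ in read_gaze_samples("segments/1/livedata.json.gz"))
```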

    6th international workshop on pervasive eye tracking and mobile eye-based interaction

    Previous work on eye tracking and eye-based human-computer interfaces mainly concentrated on making use of the eyes in traditional desktop settings. With the recent growth of interest in wearable computers, such as smartwatches, smart eyewear and low-cost mobile eye trackers, eye-based interaction techniques for mobile computing are becoming increasingly important. PETMEI 2016 focuses on the pervasive eye tracking paradigm as a trailblazer for mobile eye-based interaction, taking eye tracking out into the wild, to mobile and pervasive settings. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

    An investigation of the distribution of gaze estimation errors in head mounted gaze trackers using polynomial functions

    Second order polynomials are commonly used for estimating the point-of-gaze in head-mounted eye trackers. Studies on remote (desktop) eye trackers show that although some non-standard 3rd order polynomial models can provide better accuracy, higher-order polynomials do not necessarily provide better results. Unlike remote setups, where gaze is estimated over a relatively narrow field-of-view surface (e.g. less than 30x20 degrees on typical computer displays), head-mounted gaze trackers (HMGT) are often required to cover a relatively wide field of view to ensure that gaze is detected in the scene image even for extreme eye angles. In this paper we investigate the behavior of the gaze estimation error distribution throughout the scene camera image when polynomial functions are used. Using simulated scenarios, we describe the effects of four different sources of error: interpolation, extrapolation, parallax, and radial distortion. We show that the use of third order polynomials results in more accurate gaze estimates in HMGT, and that the use of wide angle lenses might be beneficial in terms of error reduction.
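
    As a concrete illustration of the mapping discussed above, the sketch below fits a full polynomial regression from eye-feature coordinates (e.g. the pupil center or pupil-corneal-reflection vector) to scene-camera gaze coordinates by least squares. The feature set and calibration procedure are illustrative assumptions, not the specific polynomial models evaluated in the paper; comparing order=2 against order=3 on held-out points gives a simple way to observe the interpolation and extrapolation errors discussed above.

```python
import numpy as np

def poly_features(ex, ey, order=2):
    """Polynomial terms of the eye-feature coordinates (ex, ey).

    order=2 yields the common 6-term model (1, y, y^2, x, x*y, x^2);
    order=3 adds the cubic terms (10 terms in total).
    """
    cols = []
    for i in range(order + 1):
        for j in range(order + 1 - i):
            cols.append((ex ** i) * (ey ** j))
    return np.column_stack(cols)

def fit_gaze_mapping(eye_xy, scene_xy, order=2):
    """Least-squares fit of a polynomial mapping eye coords -> scene coords.

    eye_xy, scene_xy: (n, 2) arrays of corresponding calibration samples.
    """
    A = poly_features(eye_xy[:, 0], eye_xy[:, 1], order)
    coeffs, *_ = np.linalg.lstsq(A, scene_xy, rcond=None)
    return coeffs

def estimate_gaze(eye_xy, coeffs, order=2):
    """Map eye-feature coordinates to estimated scene-camera gaze points."""
    return poly_features(eye_xy[:, 0], eye_xy[:, 1], order) @ coeffs
```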

    A gaze interactive assembly instruction with pupillometric recording

    This paper presents a study of a gaze interactive digital assembly instruction that provides concurrent logging of pupil data in a realistic task setting. The instruction allows hands-free gaze dwells as a substitute for finger clicks, and supports image rotation as well as image zooming by head movements. A user study in two LEGO toy stores with 72 children showed it to be immediately usable by 64 of them. Data logging of view-times and pupil dilations was possible for 59 participants. On average, the children spent half of the time attending to the instruction (S.D. 10.9%). The recorded pupil size showed a decrease throughout the building process, except when the child had to back-step: a regression was found to be followed by a pupil dilation. The main contribution of this study is to demonstrate gaze-tracking technology capable of supporting both robust interaction and concurrent, non-intrusive recording of gaze and pupil data in the wild. Previous research has found pupil dilation to be associated with changes in task effort. However, other factors like fatigue, head motion, or ambient light may also have an impact. The final section summarizes our approach to this complexity of real-task pupil data collection and makes suggestions for how future applications may utilize pupil information.
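
    A dwell-based "click" of the kind used in the instruction can be sketched as a small state machine: a selection fires when gaze stays inside one region of the display for a fixed dwell time. The region format and the one-second threshold below are illustrative assumptions, not the study's parameters.

```python
class DwellDetector:
    """Trigger a selection when gaze stays inside a region for dwell_time seconds."""

    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time
        self._region = None      # region the gaze is currently in
        self._enter_ts = None    # time the gaze entered that region

    def update(self, ts, gaze_xy, regions):
        """Feed one gaze sample; return the selected region id or None.

        regions: dict mapping region id -> (x_min, y_min, x_max, y_max).
        """
        hit = None
        for rid, (x0, y0, x1, y1) in regions.items():
            if x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1:
                hit = rid
                break
        if hit != self._region:              # gaze moved to a new region (or left all)
            self._region, self._enter_ts = hit, ts
            return None
        if hit is not None and ts - self._enter_ts >= self.dwell_time:
            self._enter_ts = ts              # reset so the dwell does not re-fire immediately
            return hit
        return None

# Example (hypothetical region layout):
# detector = DwellDetector(dwell_time=1.0)
# selected = detector.update(ts, (gx, gy), {"next_step": (0.8, 0.0, 1.0, 0.2)})
```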

    BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement

    Eye gaze is a fast and ergonomic modality for pointing but limited in precision and accuracy. In this work, we introduce BimodalGaze, a novel technique for seamless head-based refinement of a gaze cursor. The technique leverages insights into eye-head coordination to separate natural from gestural head movement. This allows users to quickly shift their gaze to targets over larger fields of view with naturally combined eye-head movement, and to refine the cursor position with gestural head movement. In contrast to an existing baseline, head refinement is invoked automatically, and only if a target is not already acquired by the initial gaze shift. Study results show that users reliably achieve fine-grained target selection, but we observed a higher rate of initial selection errors affecting overall performance. An in-depth analysis of user performance provides insight into the classification of natural versus gestural head movement, for improvement of BimodalGaze and other potential applications.
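
    One way to picture the separation of natural from gestural head movement is a per-sample heuristic: when the eyes counter-rotate in the head (VOR-like) so that gaze in the world stays put while the head moves, the movement is treated as gestural refinement; when eye and head velocities do not cancel, it is a natural gaze shift. The sketch below follows that assumption with illustrative thresholds; it is not the classifier used in BimodalGaze.

```python
def classify_head_sample(head_vel, eye_vel, head_thresh=5.0, gain_tol=0.3):
    """Label one sample as 'still', 'natural', or 'gestural'.

    head_vel : head angular velocity (deg/s)
    eye_vel  : eye-in-head angular velocity (deg/s)
    Heuristic (illustrative): during gestural refinement the eyes
    counter-rotate in the head (eye_vel ~= -head_vel), so gaze-in-world
    velocity stays near zero; during a natural gaze shift it does not.
    """
    if abs(head_vel) < head_thresh:
        return "still"
    gaze_vel = head_vel + eye_vel          # gaze-in-world velocity
    if abs(gaze_vel) < gain_tol * abs(head_vel):
        return "gestural"                  # head moves but gaze stays on target
    return "natural"
```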

    Effect of aging on post-saccadic oscillations

    Recent research has shown that the eye movement data measured by an eye tracker do not necessarily reflect the exact rotations of the eyeball. For example, post-saccadic eye movements may reflect relative movement between the pupil and the iris rather than oscillations of the eyeball. Since accurate measurement of eye movements is important in many studies, it is crucial to identify the different factors that influence the dynamics of the eye movements measured by an eye tracker. It has been shown that deformation of the internal structure of the iris and the size of the pupil directly affect the amplitude of the post-saccadic oscillations measured by pupil-based, video-based eye trackers. In this paper, we look at the effect of aging on post-saccadic oscillations. We recorded eye movements from a group of 43 young and 22 older participants during an abstract and a more natural viewing task. The recording was conducted with a video-based eye tracker using the pupil center and corneal reflection. We anticipated that changes in muscle strength as an effect of aging might affect, directly or indirectly, the post-saccadic oscillations. Results showed that the post-saccadic oscillations were significantly larger for our older group. The results suggest that aging has to be considered as an important factor when studying post-saccadic eye movements.
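
    Post-saccadic oscillation size can be quantified from a position trace in a few lines: detect saccade offset with a velocity threshold, then take the peak deviation from the settled position in a short window after offset. The sketch below follows that simple recipe with illustrative thresholds; it is not the analysis pipeline used in the paper.

```python
import numpy as np

def pso_amplitude(position, fs, vel_thresh=30.0, window=0.08):
    """Estimate post-saccadic oscillation amplitude from a gaze trace.

    position : 1-D position signal (deg) containing a single saccade,
               ending in stable fixation
    fs       : sampling rate in Hz
    Returns the peak deviation (deg) from the settled position within
    `window` seconds after saccade offset.  Thresholds are illustrative.
    """
    position = np.asarray(position, dtype=float)
    velocity = np.gradient(position) * fs                 # deg/s
    above = np.flatnonzero(np.abs(velocity) > vel_thresh)
    if above.size == 0:
        return 0.0                                         # no saccade detected
    offset = above[-1] + 1                                 # first post-saccadic sample
    stop = min(len(position), offset + int(window * fs))
    if offset >= stop:
        return 0.0
    settled = position[-1]                                 # final fixation position
    return float(np.max(np.abs(position[offset:stop] - settled)))
```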